Change cursors batch time formula on presence update #333
Conversation
@JoaoDiasAbly given this is a public repo, please can you add a description of the change directly, since not all people will be able to see the linked ticket.
```diff
@@ -105,7 +105,7 @@ export default class Cursors extends EventEmitter<CursorsEventMap> {
     const channel = this.getChannel();
     const cursorsMembers = await channel.presence.get();
     this.cursorBatching.setShouldSend(cursorsMembers.length > 1);
-    this.cursorBatching.setBatchTime(cursorsMembers.length * this.options.outboundBatchInterval);
+    this.cursorBatching.setBatchTime(Math.ceil(cursorsMembers.length / 100) * this.options.outboundBatchInterval);
```
It's hard to understand why this is divided by 100 now?
Before, we were adjusting with cursorsMembers.length * this.options.outboundBatchInterval, which made the client-side latency climb linearly with the size of the presence set. Since we now have server-side batching, the client-side growth doesn't need to be so steep. The ticket therefore suggested scaling in groups of 100, which makes the client-side batching latency grow much more slowly.
Example, assuming outboundBatchInterval=25ms:
- 1–100 presence members: batchTime=25ms
- 101–200 presence members: batchTime=50ms
- 201–300 presence members: batchTime=75ms
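A minimal TypeScript sketch of the tiered formula, assuming a standalone batchTimeFor helper and a 25 ms outboundBatchInterval (both introduced here purely for illustration):

```typescript
// Hypothetical helper illustrating the tiered batch-time formula above.
// The 25 ms interval mirrors the example; in the library the real value
// comes from this.options.outboundBatchInterval in the Cursors class.
const outboundBatchInterval = 25;

function batchTimeFor(presenceMembers: number): number {
  // Each full or partial group of 100 presence members adds one interval.
  return Math.ceil(presenceMembers / 100) * outboundBatchInterval;
}

console.log(batchTimeFor(80));  // 25 ms  (1–100 members)
console.log(batchTimeFor(150)); // 50 ms  (101–200 members)
console.log(batchTimeFor(250)); // 75 ms  (201–300 members)
```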
OK, thanks, that makes sense. Perhaps put that as a comment in the code?
Added as a comment in the code
The branch was force-pushed from 867a674 to f2f239a.
https://ably.atlassian.net/browse/REA-1868
This PR changes the rate at which we adapt the (client-side) batch time: the outboundBatchInterval is now multiplied by the number of groups of 100 presence members rather than by the raw presence member count. This change, which smooths the growth of the client-side batching interval, is motivated by the addition of server-side batching, which is now enabled by default for cursors channels.